On the Performance of Kernel Classes

Author

  • Shahar Mendelson

Abstract

where (Ω, μ) is a probability space. The kernel K is used to generate a Hilbert space, known as a reproducing kernel Hilbert space, whose unit ball is the class of functions we investigate. Recall that if K : Ω × Ω → R is a positive definite function, then by Mercer's Theorem there is an orthonormal basis (φ_i)_{i=1}^∞ of L_2(μ) such that, μ × μ almost surely, K(x, y) = ∑_{i=1}^∞ λ_i φ_i(x) φ_i(y), where (λ_i)_{i=1}^∞ is the sequence of eigenvalues of the integral operator T_K (arranged in non-increasing order) and φ_i is the eigenvector corresponding to λ_i. Let H_K be the set of functions of the form ∑_{i=1}^∞ a_i K(x_i, ·), where x_i ∈ Ω and a_i ∈ R satisfy ∑_{i,j=1}^∞ a_i a_j K(x_i, x_j) ≤ 1. One can show that this so-called kernel class H_K is the unit ball of the reproducing kernel Hilbert space defined by the integral operator, and that for every f ∈ H_K, ‖f‖_∞ ≤ ‖K‖_∞. An alternative way to define the reproducing kernel Hilbert space is via the feature map: if we define Φ : Ω → ℓ_2 by Φ(x) = (√λ_i φ_i(x))_{i=1}^∞, then ...
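The unit-ball constraint ∑_{i,j} a_i a_j K(x_i, x_j) ≤ 1 and the resulting sup-norm bound can be checked numerically. The sketch below is illustrative only (the Gaussian kernel, sample points, and coefficients are assumptions, not taken from the paper); it normalizes a finite kernel expansion f = ∑ a_i K(x_i, ·) to lie on the boundary of the kernel class and verifies that ‖f‖_∞ stays below ‖K‖_∞, which equals 1 for this kernel.

```python
import numpy as np

rng = np.random.default_rng(0)

def k(x, y):
    # Gaussian kernel on R; note sup_{x,y} K(x, y) = K(x, x) = 1
    return np.exp(-(x - y) ** 2)

# Illustrative anchor points x_i and coefficients a_i
xs = rng.uniform(-1.0, 1.0, size=5)
a = rng.normal(size=5)

# Gram matrix G_{ij} = K(x_i, x_j); the RKHS norm of f is sqrt(a' G a)
G = k(xs[:, None], xs[None, :])
a = a / np.sqrt(a @ G @ a)          # rescale so that sum a_i a_j K(x_i, x_j) = 1

# Evaluate f = sum_i a_i K(x_i, .) on a grid and estimate its sup norm
grid = np.linspace(-3.0, 3.0, 601)
f = (a[:, None] * k(xs[:, None], grid[None, :])).sum(axis=0)

norm_sq = a @ G @ a                  # RKHS norm squared, now exactly 1
sup_f = np.max(np.abs(f))            # should not exceed ||K||_inf = 1
print(norm_sq, sup_f)
```

Here the bound follows from the reproducing property: |f(x)| = |⟨f, K(x, ·)⟩| ≤ ‖f‖_{H_K} √K(x, x), and both factors are at most 1 for a function in the unit ball of this kernel.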


Similar articles

Composite Kernel Optimization in Semi-Supervised Metric

Machine-learning solutions to classification, clustering and matching problems critically depend on the adopted metric, which in the past was selected heuristically. In the last decade, it has been demonstrated that an appropriate metric can be learnt from data, resulting in superior performance as compared with traditional metrics. This has recently stimulated a considerable interest in the to...


A Geometry Preserving Kernel over Riemannian Manifolds

Abstract: The kernel trick and projection to tangent spaces are two options for linearizing data points lying on Riemannian manifolds. These approaches provide the prerequisites for applying standard machine learning methods on Riemannian manifolds. Classical kernels implicitly project data to a high-dimensional feature space without considering the intrinsic geometry of the data points. ...


Bivariate Extension of Past Entropy

Di Crescenzo and Longobardi (2002) proposed a measure of uncertainty related to past life, namely past entropy. The present paper addresses the question of extending this concept to the bivariate set-up and studies some properties of the proposed measure. It is shown that the proposed measure uniquely determines the distribution function. Characterizations for some bivariate lifetime models a...


Online learning of positive and negative prototypes with explanations based on kernel expansion

The issue of classification is still a topic of discussion in many current articles. Most of the models presented in these articles suffer from a lack of explanations comprehensible to humans. One way to create explainability is to separate the weights of the network into positive and negative parts based on the prototype. The positive part represents the weights of the correct class ...


Discrimination of time series based on kernel method

Classical discrimination methods, such as linear and quadratic discriminant analysis, are not efficient for non-Gaussian or nonlinear time series data. Nonparametric kernel discrimination, in which kernel estimators of the likelihood functions are used instead of their true values, has been shown to perform well. The misclassification rate of kernel discrimination is usually less than ...


Comparison of the Gamma kernel and the orthogonal series methods of density estimation

The standard kernel density estimator suffers from a boundary bias issue for probability density functions of distributions on the positive real line. The Gamma kernel estimators and orthogonal series estimators are two alternatives which are free of boundary bias. In this paper, a simulation study is conducted to compare the small-sample performance of the Gamma kernel estimators and the orthog...



Journal:
  • Journal of Machine Learning Research

Volume 4, Issue 

Pages  -

Publication date: 2003